Aug 28, 2019 · We perform data selection for multiple domains at once. This is achieved by carefully introducing instance-level domain-relevance features and automatically constructing a training curriculum to gradually concentrate on multi-domain relevant and noise-reduced data batches.
Unsupervised Domain Clusters in Pretrained Language Models · Coupling Distant Annotation and Adversarial Training for Cross-Domain Chinese Word Segmentation.
Aug 28, 2019 · It computes multiple data selection scores for each training example, each score measuring how useful the example is to a certain task. It uses ...
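The snippet above describes the core mechanism: each training example receives several data-selection scores, and a combination of those scores drives the training curriculum. Below is a minimal, hypothetical Python sketch of that idea, not the paper's implementation; the feature names (news_relevance, denoise), the linear combination, and the sort-based curriculum ordering are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Example:
    source: str
    target: str
    # Per-example data-selection scores, e.g. domain relevance or denoising
    # scores (feature names here are assumptions, not the paper's).
    features: Dict[str, float]


def curriculum_weight(example: Example, feature_weights: Dict[str, float]) -> float:
    """Combine the per-feature scores into a single curriculum weight."""
    return sum(feature_weights[name] * value
               for name, value in example.features.items())


def order_curriculum(examples: List[Example],
                     feature_weights: Dict[str, float]) -> List[Example]:
    """Sort examples so the most relevant ones come last, letting training
    gradually concentrate on multi-domain relevant, noise-reduced data."""
    return sorted(examples, key=lambda ex: curriculum_weight(ex, feature_weights))


if __name__ == "__main__":
    data = [
        Example("guten tag", "good day", {"news_relevance": 0.2, "denoise": 0.9}),
        Example("der patient", "the patient", {"news_relevance": 0.1, "denoise": 0.8}),
        Example("klick hier", "clikc here", {"news_relevance": 0.05, "denoise": 0.1}),  # noisy pair
    ]
    # Hypothetical per-feature weights; the paper learns such weighting automatically.
    weights = {"news_relevance": 1.0, "denoise": 1.0}
    for ex in order_curriculum(data, weights):
        print(round(curriculum_weight(ex, weights), 2), ex.source, "->", ex.target)
```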
May 2, 2020 · Learning a Multi-Domain Curriculum for Neural Machine Translation. Wei Wang (Google Research, wangwe@google.com), Ye Tian (Google Research, ytian ...)
This work introduces a curriculum learning approach to adapt generic neural machine translation models to a specific domain and consistently outperforms ...
Multi-domain adaptation of neural machine translation (NMT) aims to learn a unified seq2seq framework based on multi-domain data. Domain corpus data mixing is ...
Learning a Multi-Domain Curriculum for Neural Machine Translation. W. Wang, Y. Tian, J. Ngiam, Y. Yang, I. Caswell, and Z. Parekh. ACL 2020, pages 7711-7723.
Jun 2, 2022 · Based on the results of multiple experiments, we show that our method offers a generic framework to automatically handle several real-world ...